1.
J Emerg Med. 2016 Dec;51(6):697-704.
Article in English | MEDLINE | ID: mdl-27618476

ABSTRACT

BACKGROUND: Reading emergent electrocardiograms (ECGs) is one of the emergency physician's most crucial tasks, yet no well-validated tool exists to measure resident competence in this skill.
OBJECTIVES: To assess the validity of a novel tool measuring emergency medicine resident competency in interpreting, and responding to, critical ECGs, and to observe trends in this skill among resident physicians at different levels of training.
METHODS: This was a multicenter, prospective study of postgraduate year (PGY) 1-4 residents at five emergency medicine (EM) residency programs in the United States. An assessment tool was created that asks the physician to identify either the ECG diagnosis or the best immediate management.
RESULTS: One hundred thirteen EM residents from five EM residency programs submitted completed assessment surveys, including 43 PGY-1s, 33 PGY-2s, and 37 PGY-3/4s. PGY-3/4s averaged 74.6% correct (95% confidence interval [CI] 70.9-78.4) and performed significantly better than PGY-1s, who averaged 63.2% correct (95% CI 58.0-68.3). PGY-2s averaged 69.0% (95% CI 62.2-73.7). Year-to-year differences were more pronounced in management than in diagnosis.
CONCLUSIONS: Residency training in EM appears to be associated with improved ability to interpret "critical" ECGs as measured by our assessment tool. This lends validity evidence for the tool by replicating a previously observed association between residency training and improved ECG interpretation. Resident skill in ECG interpretation nonetheless remains less than ideal. A tool of this sort may allow programs to assess resident performance and to evaluate interventions designed to improve competency.
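The per-cohort figures above are mean percent-correct scores with 95% confidence intervals. As a rough illustration of how such an interval could be computed (a sketch only, using a normal approximation; the scores below are invented, not study data):

```python
import math

def mean_ci95(scores):
    """Mean percent-correct score with a normal-approximation 95% CI.

    Returns (mean, lower bound, upper bound). Uses the sample standard
    deviation and the usual mean +/- 1.96 * SE construction.
    """
    n = len(scores)
    mean = sum(scores) / n
    sd = math.sqrt(sum((s - mean) ** 2 for s in scores) / (n - 1))
    half_width = 1.96 * sd / math.sqrt(n)
    return mean, mean - half_width, mean + half_width

# Hypothetical example: three residents' percent-correct scores.
m, lo, hi = mean_ci95([60.0, 70.0, 80.0])
print(f"{m:.1f}% (95% CI {lo:.1f}-{hi:.1f})")  # mean 70.0, CI approx. 58.7-81.3
```

With the small samples typical of a single PGY cohort, a t-based interval would be slightly wider; the normal approximation is used here only to keep the sketch short.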


Subject(s)
Arrhythmias, Cardiac/diagnosis; Educational Measurement/methods; Electrocardiography; Emergency Medicine/standards; Internship and Residency; Myocardial Infarction/diagnosis; Clinical Competence/standards; Emergency Medicine/education; Humans; Hyperkalemia/diagnosis; Prospective Studies
2.
J Emerg Med. 2015 Jul;49(1):64-9.
Article in English | MEDLINE | ID: mdl-25843930

ABSTRACT

BACKGROUND: The Emergency Medicine In-Training Examination (EMITE) is one of the few validated instruments for medical knowledge assessment of emergency medicine (EM) residents. The EMITE is administered only once annually, with results available just 2 months before the end of the academic year. An earlier predictor of EMITE scores would help educators institute timely remediation plans. A previous single-site study found that only 69% of faculty predictions of EMITE scores were accurate.
OBJECTIVE: To measure the accuracy with which EM faculty at five residency programs could predict EMITE scores for their resident physicians.
METHODS: We asked EM faculty at five residency programs to predict the 2014 EMITE scores of all their respective resident physicians. The primary outcome was prediction accuracy, defined as the proportion of predictions within 6% of the actual scores. The secondary outcome was prediction precision, defined as the mean deviation of predictions from the actual scores. We assessed faculty background variables for correlation with the two outcomes.
RESULTS: One hundred eleven faculty participated in the study (response rate 68.9%). Mean prediction accuracy across all faculty was 60.0%; mean prediction precision was 6.3%. Participants were slightly more accurate at predicting the scores of noninterns than those of interns. No faculty background variable correlated with either outcome. Eight participants predicted scores with high accuracy (>80%).
CONCLUSIONS: In this multicenter study, EM faculty predicted resident EMITE scores with only moderate accuracy. A very small subset of faculty members was highly accurate.
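The two outcome definitions in the methods above translate directly into simple computations. A minimal sketch (function names and the example scores are invented for illustration, not taken from the study):

```python
def prediction_accuracy(predicted, actual, tolerance=6.0):
    """Proportion of predictions within `tolerance` percentage points
    of the actual scores (the study's primary outcome definition)."""
    hits = sum(1 for p, a in zip(predicted, actual) if abs(p - a) <= tolerance)
    return hits / len(actual)

def prediction_precision(predicted, actual):
    """Mean absolute deviation of predictions from actual scores
    (the study's secondary outcome definition)."""
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)

# Hypothetical faculty predictions vs. actual EMITE scores for four residents.
predicted = [72.0, 65.0, 80.0, 58.0]
actual = [70.0, 74.0, 77.0, 60.0]
print(prediction_accuracy(predicted, actual))   # 0.75 (3 of 4 within 6 points)
print(prediction_precision(predicted, actual))  # 4.0  (mean deviation in points)
```

Note that "within 6%" is treated here as within 6 percentage points of the actual score, which matches how the precision outcome (6.3%) is reported.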


Subject(s)
Educational Measurement; Emergency Medicine/education; Faculty, Medical; Internship and Residency; Clinical Competence; Educational Status; Forecasting/methods; Humans; Prospective Studies